On the capacity of Markov sources over noisy channels

Author

  • Aleksandar Kavcic
Abstract

We present an expectation-maximization method for optimizing Markov process transition probabilities to increase the mutual information rate achievable when the Markov process is transmitted over a noisy finite-state machine channel. The method provides a tight lower bound on the achievable information rate of a Markov process over a noisy channel, and it is conjectured that it actually maximizes this information rate. The latter statement is supported by empirical evidence (not shown in this paper) obtained through brute-force optimization of low-order Markov processes. The proposed expectation-maximization procedure can be used to find tight lower bounds on the capacities of finite-state machine channels (say, partial-response channels) or the noisy capacities of constrained (say, run-length limited) sequences, with the bounds becoming arbitrarily tight as the memory length of the input Markov process approaches infinity. The method links the Arimoto-Blahut algorithm to Shannon's noise-free entropy maximization by introducing the noisy adjacency matrix.

… a similar stochastic expectation-maximization for Markov sources, linking the method to a noisy adjacency matrix, whose computation is shown in Section V. Section VI gives a numerical example by computing a lower bound on the capacity of run-length limited sequences over the binary symmetric channel. Section VII concludes the paper.

Notation: The superscript T denotes matrix and vector transposition. Random variables are denoted by uppercase letters, while their realizations are denoted by lowercase letters. If a random variable is a member of a random sequence, an index t is used to denote time, e.g., X_t. A vector of random variables [X_i, X_{i+1}, ..., X_j]^T is denoted in short by X_i^j, while its realization is denoted by x_i^j. The letter H denotes entropy, the letter I denotes mutual information, and a script letter denotes the mutual information rate.
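The noise-free side of the link the abstract mentions is Shannon's classical result: the capacity of a constrained sequence is log2 of the largest eigenvalue of the adjacency matrix of its constraint graph. A minimal sketch of that computation (using NumPy; the choice of the (1, inf) run-length limited constraint as the example is this note's, not the paper's):

```python
# Shannon's noise-free capacity of a constrained sequence equals
# log2(lambda_max), where lambda_max is the largest eigenvalue of the
# adjacency matrix of the constraint graph. Illustrated for the
# (d, k) = (1, inf) run-length limited constraint (no two adjacent 1s),
# whose two-state graph has adjacency matrix [[1, 1], [1, 0]].
import numpy as np

A = np.array([[1.0, 1.0],   # state 0: last bit was 0 (may emit 0 or 1)
              [1.0, 0.0]])  # state 1: last bit was 1 (must emit 0)
lam_max = max(np.linalg.eigvals(A).real)
capacity = np.log2(lam_max)  # log2 of the golden ratio, about 0.6942 bit/symbol
print(f"noise-free (1,inf)-RLL capacity: {capacity:.4f} bits/symbol")
```

The paper's noisy adjacency matrix generalizes exactly this eigenvalue computation to channels with noise.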


Similar articles

The Information Flow and Capacity of Channels with Noisy Feedback

In this paper, we consider some long-standing problems in communication systems with access to noisy feedback. We introduce a new notion, the residual directed information, to capture the effective information flow (i.e. mutual information between the message and the channel outputs) in the forward channel. In light of this new concept, we investigate discrete memoryless channels (DMC) with noi...

Full text

On the Multiple Access Channel with Asymmetric Noisy State Information at the Encoders

We consider the problem of reliable communication over multiple-access channels (MAC) where the channel is driven by an independent and identically distributed state process and the encoders and the decoder are provided with various degrees of asymmetric noisy channel state information (CSI). For the case where the encoders observe causal, asymmetric noisy CSI and the decoder observes complete ...

Full text

Statistical mechanics of LDPC codes on channels with memory

Introduction. – A common problem in modern mobile telecommunication systems is that the strength of the signal varies over time as a result of e.g. the motion of the receiver with respect to the source and the varying number of obstacles that shadow the signal over time. Channels describing communication of attenuated signals are termed ‘fading channels’. Fading channels are modeled by finite-s...

Full text

Joint Source-channel Decoding of Entropy Coded Markov Sources over Binary Symmetric Channels

This paper proposes an optimal joint source-channel maximum a posteriori (MAP) decoder for entropy-coded Markov sources transmitted over noisy channels. We introduce the concept of incomplete and complete states to deal with the problem of variable-length source codes in the decoder. The proposed decoder is sequential, thereby making the expected delay finite. When compared to the standard Huffman...

Full text

The adaptive zero-error capacity for a class of channels with noisy feedback

The adaptive zero-error capacity of discrete memoryless channels (DMC) with noiseless feedback has been shown to be positive whenever there exists at least one channel output “disprover”, i.e. a channel output that cannot be reached from at least one of the inputs. Furthermore, whenever there exists a disprover, the adaptive zero-error capacity attains the Shannon (small-error) capacity. Here, ...
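The disprover condition in this abstract has a simple operational reading: an output symbol is a disprover if some row of the DMC transition matrix assigns it probability zero. A minimal sketch of that check (the helper name and the binary erasure channel example are this note's assumptions, not from the cited paper):

```python
# An output y is a "disprover" if it cannot be reached from at least one
# input, i.e., W[x][y] == 0 for some input x, where W is the DMC transition
# matrix with rows indexed by inputs and columns by outputs.
def disprovers(W):
    """Return the set of output indices y with W[x][y] == 0 for some input x."""
    n_outputs = len(W[0])
    return {y for y in range(n_outputs) if any(row[y] == 0.0 for row in W)}

# Binary erasure channel, erasure probability 0.1; outputs are {0, e, 1}.
W_bec = [[0.9, 0.1, 0.0],   # input 0 never produces output 1
         [0.0, 0.1, 0.9]]   # input 1 never produces output 0
print(disprovers(W_bec))    # outputs 0 and 2 are disprovers -> {0, 2}
```

By contrast, a binary symmetric channel with crossover probability strictly between 0 and 1 has no disprover, so its zero-error capacity with feedback is zero.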

Full text


Journal title:

Volume   Issue

Pages  -

Publication date: 2001